Google Search Console (GSC) API is a widely used source of website search performance data. Still, it can never supply you with any data on competitor websites. To make matters worse, some of the latest changes are yet another reminder that you can’t always lean on Google. Now and again, it overhauls the data its tools and APIs furnish, and tends to lock down the latter even while improving the former.
In this article, we’ll explain everything there is to know about Search Console API, including what stays, what’s changed, and how to get more data using alternative solutions.
Many SEOs and developers rely on the Search Console API to fetch data beyond the 1,000-row limit of the native interface and get a complete picture of website progress in search results. But, frustrating as it is, the GSC API cannot give you as much data as you truly need for comprehensive SEO analysis.
- Sitemaps resource – lets you get information about a specific sitemap, retrieve the list of sitemaps submitted for a site, and submit or delete a sitemap.
- Sites resource – gives access to information about a site in SC and users’ permission levels for it, and lets you add a site to or remove it from a user’s SC sites.
- Search Analytics API – supplies vital data for landing pages and queries that bring traffic to a website: impressions, clicks, CTR, average position.
Search Analytics API is often used to query keyword data for detailed ranking and traffic reports. For example, you can explore pages alongside keywords they rank for and vice versa to assess potential traffic growth based on impressions, clicks, and CTR. The API allows retrieving data for a site in total, for separate pages, or for separate queries. You can get historical data for up to 16 months and filter it by search type (news, video, image, web), countries, devices (mobile, desktop, tablet), queries, or pages. However, you cannot group data by the same dimension twice.
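To make the request shape concrete, here is a minimal sketch of assembling a Search Analytics query with google-api-python-client. The helper function and its parameter defaults are our own illustration; authentication setup is omitted, and the property URL in the comment is a placeholder:

```python
def build_search_analytics_request(start_date, end_date,
                                   dimensions=("query", "page"),
                                   search_type="web", row_limit=25000):
    """Request body for the searchanalytics.query method.

    Each dimension may appear only once -- the API does not allow
    grouping data by the same dimension twice.
    """
    if len(set(dimensions)) != len(dimensions):
        raise ValueError("dimensions must be unique")
    return {
        "startDate": start_date,   # "YYYY-MM-DD", up to 16 months back
        "endDate": end_date,
        "dimensions": list(dimensions),
        "type": search_type,       # "web", "image", "video", or "news"
        "rowLimit": row_limit,     # the API caps one response at 25,000 rows
    }

# With an authorized client, it would be used roughly like this:
# service = googleapiclient.discovery.build("searchconsole", "v1",
#                                           credentials=creds)
# body = build_search_analytics_request("2024-01-01", "2024-03-31")
# rows = service.searchanalytics().query(
#     siteUrl="https://example.com/", body=body).execute().get("rows", [])
```

Paginating past the 25,000-row cap is done by re-sending the same body with an incremented `startRow`.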
Integration with Search Analytics API can certainly bring an additional layer of insights into your reports, and we’re not saying you should call it off. However, we are saying that you could push some boundaries with a more versatile data source.
Of course, search performance data is available through Search Analytics API only if you have access to the domain property of a website in Search Console. But no good SEO analytics and strategic planning can be done without competitor research.
Here at DataForSEO, we make competitor data accessible via API. You can get keywords, rankings, and estimated search traffic data for any website with DataForSEO Labs API, namely:
- Ranked Keywords and related SERP elements for any domain
- SERP Competitors for up to 200 keywords specified in a single request
- Relevant Pages and Subdomains of your competitors with rank, impressions, and traffic data
- Domain Intersection keywords for which both specified domains rank within the same SERP
- Domain Rank Overview of any site with rank and traffic data from organic and paid search
- Competitor Domains to the target domain with their ranking and traffic data
- Page Intersection keywords several pages rank for
With each endpoint, you will get rich sets of keyword metrics based on Google Ads data, including:
- Cost per click
- Product and service categories
- Monthly search trend
- Daily impressions
- Daily clicks and daily cost
Besides that, we provide two perspectives on estimated website traffic: based on search volume and based on impressions. We also calculate the estimated cost of the monthly search traffic for both organic and paid keywords. Furthermore, we have advanced filtering and sorting parameters that you can learn more about here.
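For instance, a Ranked Keywords request could be put together as follows. This is a hypothetical sketch: the task fields mirror the structure described in the DataForSEO docs, but check the current Labs API reference for exact parameter names and values:

```python
def ranked_keywords_task(target, location_code=2840,
                         language_name="English", limit=1000):
    """One task for POST /v3/dataforseo_labs/google/ranked_keywords/live.

    location_code 2840 stands for the United States; tasks are sent as
    a JSON array, so even a single task is wrapped in a list.
    """
    return [{
        "target": target,               # domain or subdomain to analyze
        "location_code": location_code,
        "language_name": language_name,
        "limit": limit,                 # rows to return (billed per row)
    }]

# With the requests library and your API credentials, it would be posted as:
# requests.post(
#     "https://api.dataforseo.com/v3/dataforseo_labs/google/ranked_keywords/live",
#     auth=(login, password), json=ranked_keywords_task("example.com"))
```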
As for the cost, we charge $0.01 for setting a task and $0.0001 for every data row you receive in the results array. For example, if you top up your account for $50, you can make around 454 requests to DataForSEO Labs API and get 1,000 data rows returned for each.
If you make a payment of $1,000, you’ll get a bonus of $250 on your account. In that case, you’ll be able to make around 11,363 requests to DataForSEO Labs API and get 1,000 data rows returned for each. If you set the maximum of 1,000 data rows as a limit but actually obtain fewer rows in the response, our system will refund the difference to your account.
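The arithmetic above is easy to verify: each request costs the task fee plus the per-row fee for every returned row. A small sketch, assuming every request returns the full 1,000 rows:

```python
def labs_requests_for_budget(budget_usd, rows_per_request=1000,
                             task_price=0.01, row_price=0.0001):
    """Number of Labs API requests a budget covers when each request
    is billed for the task plus every returned data row."""
    cost_per_request = task_price + rows_per_request * row_price  # $0.11
    return int(budget_usd // cost_per_request)

print(labs_requests_for_budget(50))    # → 454
print(labs_requests_for_budget(1250))  # → 11363  ($1,000 payment + $250 bonus)
```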
The Crawl Errors API used to shed light on up to 100,000 URLs for each error and supplied more details than the Search Console report could fit in. But last year, Google moved or removed some of the old features in the GSC reports. While the crawl errors were relocated to the Index Coverage report in the SC interface, API access to them was shut down:
“Along with the Crawl Errors report, we’re also deprecating the crawl errors API that’s based on the same internal systems. At the moment, we don’t have a replacement for this API.”
Google Search Central Blog
To this day, Google provides no substitute for the Crawl Errors API. This change on Google’s end forced many SEOs and developers who relied on it to look for workarounds and alternative ways to still get that data.
For example, Yoast started planning out integrations with other SEO platforms and building an import tool that would let users manually upload error reports from SC into Yoast. Even though these can be possible ways out, the latter option most likely won’t be viable: manual uploads aren’t convenient and won’t suit owners of really large websites.
As for integrations, they are undoubtedly key to merging the many moving parts of SEO together. On the other hand, they often mean that users have to subscribe to those other platforms as well, while Yoast could keep those additional customer payments for itself by introducing a proprietary system for checking website errors. Launching such an option does not necessarily mean spending tons of time and resources to develop a crawler internally.
Upon Google’s announcement of the evergreen Googlebot in May 2019, we integrated the Chromium project library, and we regularly update the Chrome version used for rendering.
Besides all of the above, our API surfaces spell-check errors and suggestions and scans page content to provide you with five different readability index values:
- Automated Readability Index
- Coleman–Liau Index
- Dale–Chall Readability Index
- Flesch–Kincaid Readability Index
- SMOG Readability Index
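To give an idea of what such an index measures, here is a self-contained illustration of the Automated Readability Index formula (the API computes these values server-side; this sketch, including its simple tokenization, is only for illustration):

```python
import re

def automated_readability_index(text):
    """Automated Readability Index (ARI) for a piece of text.

    ARI = 4.71 * (characters / words) + 0.5 * (words / sentences) - 21.43,
    where characters counts letters and digits only. Higher values
    roughly correspond to higher US school grade levels.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z0-9]+", text)
    if not words or not sentences:
        return 0.0
    characters = sum(len(w) for w in words)
    return (4.71 * characters / len(words)
            + 0.5 * len(words) / len(sentences)
            - 21.43)
```

Short words and short sentences push the score down; long, clause-heavy sentences push it up.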
The API also returns page load metrics, including time to interactive, time to load resources, time to first byte, and time to complete downloading the HTML resource. By the way, you can always obtain the raw HTML of the crawled page.
The price per crawled page is $0.001625 if you enable all features, and $0.000125 if you use basic features only. You can also use specific features separately. Then, the price per crawled page will be:
- $0.000375 with load resources
For example, if you top up the account for $50, it will cover around 30,769 pages crawled with all features enabled, or 400,000 pages crawled with basic features only.
At the same time, for a payment of $1,000, you’ll get an additional $250 on your account balance, and those $1,250 will be enough for around 769,230 pages crawled with all features enabled, or 10,000,000 pages crawled with basic features only.
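These budget estimates are simple division; working in integer micro-dollars avoids floating-point surprises at prices like $0.000125. A sketch, assuming the per-page prices quoted above:

```python
def pages_for_budget(budget_usd, price_per_page_usd):
    """Pages a budget covers at a flat per-page price.

    Amounts are converted to integer micro-dollars so that, e.g.,
    $50 / $0.000125 comes out as exactly 400,000 pages.
    """
    budget_micro = round(budget_usd * 1_000_000)
    price_micro = round(price_per_page_usd * 1_000_000)
    return budget_micro // price_micro

print(pages_for_budget(50, 0.001625))    # all features → 30769
print(pages_for_budget(50, 0.000125))    # basic only   → 400000
print(pages_for_budget(1250, 0.001625))  # all features → 769230
```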
Google Search Console API currently offers a limited set of capabilities:
- Running mobile-friendliness tests
- Retrieving impressions, clicks, CTR, average position for landing pages and queries
- Managing sitemaps and users’ permission levels in SC
Sure enough, insights into competitor performance are the most important missing puzzle piece for comprehensive search analytics. Yet another frustrating point is the deprecation of the Crawl Errors API. In the end, Google leaves SEOs and developers no option but to look for alternative ways to obtain the missing data.